Facebook under pressure to improve livestream moderation after New Zealand mosque attacks

Facebook has removed 1.5 million videos globally of the New Zealand mosque attack, live-streamed by gunman Brenton Tarrant, above, in the first 24 hours after the attack. (Handout/AFP)
Updated 19 March 2019

CHRISTCHURCH, New Zealand: Facebook is facing pressure from New Zealand’s advertising industry and the country’s Privacy Commissioner for its role in distributing footage of the Christchurch mosque attacks.

The attack on two Christchurch mosques, which killed 50 people, was live-streamed on Facebook for almost 17 minutes.

“Our concern as an industry is that live-streaming of these events becomes the new normal,” said Paul Head, chief executive of New Zealand’s Commercial Communications Council, which represents the country’s advertising agencies.

“We’re asking all of the platforms… to take immediate steps to either put in place systems, processes, algorithms or artificial intelligence that stops this kind of event,” he said.

Lindsay Mouat, chief executive of the Association of New Zealand Advertisers, confirmed some of New Zealand’s “very largest companies” were making changes to their advertising plans in light of the mosque shootings.

Both Head and Mouat agreed social media companies must consider temporarily removing live-streaming capabilities if they were unable to moderate the content.

“This simply cannot be allowed to happen again,” Head said.

New Zealand’s Privacy Commissioner John Edwards said it was appalling that Facebook allowed the gunman to live-stream the attack for 17 minutes.

“There’s no guarantee the same thing won’t happen tomorrow,” Edwards said.

“It is simply incomprehensible and unacceptable that Facebook cannot prevent that kind of content being streamed to the world,” he said.

University of Auckland Computer Science lecturer Dr Paul Ralph said it was extremely difficult for Facebook to implement an automated system to identify a live video showing a violent crime.

“Facebook should never have launched a live-streaming service if they didn’t have a method of stopping a video ... of a terrorist act,” Dr Ralph said.

He penned an open letter, published on noted.co.nz, calling on Facebook and YouTube to confront their role in terrorism.

Live-streaming was “a feature that should have never been launched”, Dr Ralph said.

In a statement, Facebook said it was working around the clock to prevent the shooting video from showing up on the platform.

The company confirmed the video had been uploaded to Facebook 1.5 million times, 1.2 million of which were stopped at upload and therefore never published.